# MOE-128 expert architecture

## Qwen3 42B A3B Stranger Thoughts Deep20X GGUF

License: Apache-2.0
A 42B-parameter MoE model upgraded from Qwen3-30B-A3B, with creative-writing and programming abilities enhanced through Brainstorm 20x technology.
Large Language Model · Transformers · Multilingual
Author: DavidAU
Downloads: 411 · Likes: 1
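Since the model is distributed in GGUF format, it can be run locally with a GGUF-compatible runtime such as llama.cpp. The sketch below shows one common way to load a GGUF file with the llama-cpp-python bindings; the quantization variant and file name are hypothetical, and parameters should be adjusted to the available hardware.

```python
# Minimal sketch: loading a GGUF build with llama-cpp-python.
# Assumes the GGUF file has already been downloaded locally;
# the file name and quantization level below are hypothetical.
from llama_cpp import Llama

llm = Llama(
    model_path="Qwen3-42B-A3B-Stranger-Thoughts-Deep20x.Q4_K_M.gguf",  # hypothetical file name
    n_ctx=4096,        # context window size
    n_gpu_layers=-1,   # offload all layers to GPU if one is available
)

# Simple completion request to exercise the model's creative-writing focus.
out = llm.create_completion(
    "Write the opening paragraph of a short story set in a lighthouse.",
    max_tokens=200,
    temperature=0.8,
)
print(out["choices"][0]["text"])
```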